Our main goal for this week was to apply Jovo’s low-rank Kalman filter to a dataset and check its effect on discriminability. We first went through each of the provided code files to understand what it was doing. We then replicated the experiments: first we ran the parameter estimation and KFS algorithm code on simulated data, per the specifications in the code package, and next we tested these algorithms on the provided real data using the included scripts. The resulting plots matched those from the earlier testing.
Finally, we modified some R and MATLAB code to be able to run Jovo’s algorithms on an actual full dataset:
require(R.matlab)
source('C:/Users/Eric/Documents/Github/fngs/data_processing/Rutils/open_timeseries.R')

# Load the BNU1 time series and unpack the subject ids and data
fnames = list.files('2_JHU/Research/Data/BNU1', pattern="\\.rds", full.names=TRUE)
parsed = open_timeseries(fnames, sub_pos=3, run_pos=4)
ids = parsed[[3]]  # subject id for each scan
ts = parsed[[1]]   # list of (timesteps x ROIs) time-series matrices
nSubs = length(ids)
nRois = dim(ts[[1]])[2]
nStps = dim(ts[[1]])[1]

# Stack the time series into a single array and export each scan as a .mat file
wgraphs = array(dim=c(nStps, nRois, nSubs))
for (i in 1:nSubs) { wgraphs[,,i] <- ts[[i]][1:nStps, 1:nRois] }
for (i in 1:nSubs) {
  writeMat(con=paste('2_JHU/Research/Data/BNU1/timeseriesdata-', i, '.mat', sep=''),
           y=wgraphs[,,i], n=nRois)
}
In MATLAB, we then estimated the model parameters with EM and ran the Kalman filter/smoother on each scan:

% For each scan: learn the LDS parameters via EM (identity initializations,
% tolerance 1e-6, 20 iterations), run the Kalman filter/smoother, and save.
for i = 1:106
    load(strcat('Data/BNU1/timeseriesdata-', int2str(i), '.mat'));
    y = y';  % transpose to ROIs x timesteps
    [a,c,q,r,pi,v] = kfs_learn(y, eye(n), eye(n), eye(n), eye(n), y(:,1), eye(n), 1e-6, 20);
    [Fv1,Fv2,Fx1,Fx2,Sx,Sv,Scov] = KFS(a, c, q, r, pi, v, y);
    save(strcat('Data/BNU1/KFS/filtered-', int2str(i), '.mat'));
end
Finally, back in R, we computed a correlation matrix from the filtered state estimates of each scan and measured discriminability:

require("ggplot2")
require("reshape2")
source('C:/Users/Eric/Documents/GitHub/Reliability/Code/FlashRupdated/functions/distance.R')
source('C:/Users/Eric/Documents/GitHub/Reliability/Code/FlashRupdated/functions/reliability.R')
source('C:/Users/Eric/Documents/GitHub/Reliability/Code/R/processing/hell_dist.R')
source('C:/Users/Eric/Documents/R/prod_pkde.R')
source('C:/Users/Eric/Documents/R/multiplot.R')

# Load the filtered output for each scan
vars = vector("list", nSubs)
for (i in 1:nSubs) {
  vars[[i]] = readMat(paste('2_JHU/Research/Data/BNU1/KFS/filtered-', i, '.mat', sep=''))
}

# Correlation matrix of the filtered states (Sx) for each scan
# (renamed from `c` to avoid shadowing base R's c())
cmats = array(dim=c(nRois, nRois, nSubs))
for (i in 1:nSubs) { cmats[,,i] = cor(t(vars[[i]]$Sx)) }

# Pairwise distances between correlation matrices, and discriminability (MNR)
dist = distance(cmats)
m = mnr(rdf(dist, ids))

# Plot the distance matrix and the density estimates side by side
pdistc_kf <- ggplot(melt(dist), aes(x=Var2, y=Var1, fill=value)) +
  geom_tile(color="white") +
  scale_fill_gradientn(colours=c("darkblue","blue","purple","green","yellow"), name="Distance") +
  labs(x="Scan", y="Scan", title=paste("MNR = ", round(m, digits=4)))
pkdec_kf = prod_pkde(dist, ids)
multiplot(pdistc_kf, pkdec_kf, cols=2)
We ran the EM for both 20 and 100 iterations; more iterations actually resulted in worse discriminability.
[Figures: BNU_1; BNU_1 Kalman Filtered (20 iter); BNU_1 Kalman Smoothed (20 iter); BNU_1 Kalman Filtered (100 iter); BNU_1 Kalman Smoothed (100 iter)]
For the NKI and SWU datasets, several of the subject files hit singular-matrix errors during the KFS EM. This produced degenerate results for those scans, visible as zero distances in the discriminability plots.
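As a sketch (not part of the original scripts), such bad scans could be flagged directly from the distance matrix before recomputing discriminability, since a failed EM run yields zero distance to other scans. This assumes `dist`, `ids`, `mnr`, and `rdf` are defined for the dataset in question as in the BNU1 code above; `is_bad`, `dist_bsr`, and `m_bsr` are our own names:

```r
# Flag scans whose distance-matrix row contains off-diagonal zeros
# (the diagonal is always zero, so a bad scan has more than one zero)
is_bad <- apply(dist, 1, function(row) sum(row == 0) > 1)

# Recompute discriminability with the bad scans removed (BSR)
dist_bsr <- dist[!is_bad, !is_bad]
m_bsr <- mnr(rdf(dist_bsr, ids[!is_bad]))
```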
[Figures: NKI; NKI Kalman Filtered (20 iter); NKI w/ Bad Scans Removed (BSR); NKI w/ BSR Kalman Filtered (20 iter)]
[Figures: SWU_1; SWU_1 Kalman Filtered (20 iter); SWU_1 w/ Bad Scans Removed (BSR); SWU_1 w/ BSR Kalman Filtered (20 iter)]
It appears that the Kalman filtering did not improve discriminability. For the BNU dataset, we first estimated the parameters using EM with 20 iterations; the discriminability after filtering with these parameters was actually lower than it was before filtering.
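For reference, the pre-filtering baseline can be computed in the same way from correlations of the raw time series (a sketch reusing the variables and helpers defined above; `c_raw` and `m_raw` are our own names):

```r
# Correlation matrix of the unfiltered time series for each scan
c_raw = array(dim=c(nRois, nRois, nSubs))
for (i in 1:nSubs) { c_raw[,,i] = cor(wgraphs[,,i]) }

# Discriminability (MNR) before Kalman filtering, for comparison
m_raw = mnr(rdf(distance(c_raw), ids))
```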